TOK Friends / Group items tagged the Arts


5 things art is not - 3 views

  • 5 Things Art is Not
  • In over two decades of study through my work as a curator, college professor, and critic, art has become more mysterious, eluding my attempts to pin it down, to fully understand it, becoming so much more and other than I had expected.
  • But what I have learned are a few things about what art is not, things that I have believed about art over the years that I have gradually had to abandon.
  • 1. Art is not an abstract category.
  • You do not stand in front of "Art" in a museum or a gallery; you stand in front of a particular painting. The artist Barnett Newman once quipped, "aesthetics is for artists what ornithology is for birds."
  • 2. Art is not a political weapon.
  • The value of art is not found in its capacity to effect political change "out there" but in its capacity to work on us—you and me—to connect with our pain and suffering, hope and yearning, to have the run of our "inner chambers."
  • 3. Art is not easy.
  • Any cultural practice that is worth taking seriously, that has a history and tradition, whether cooking, ballroom dancing, or chess, takes practice, and thus requires effort to learn.
  • Each work of art is a response to a conversation that spans centuries; each artist receives and passes on this tradition in their own distinctive way.
  • 4. Art is not a visual illustration of the artist’s worldview.
  • No human being possesses a unified "worldview" that is manifest in and through each of her intentional acts or the artifacts she produces.
  • An artist does not paint a picture to express what she already knows or believes. She paints to learn something about herself and the world—something she doesn’t already know.
  • 5. Art does not form virtue.
  • Art always pushes against the pragmatism, moralism, and utilitarianism that shapes life inside as well as outside the church. It starts with our weakness, desperation, and brokenness in its search for hope and beauty and stakes its existence and relevance on the belief that all appearances deceive.

How the Internet Gets Inside Us: The New Yorker - 0 views

  • It isn’t just that we’ve lived one technological revolution among many; it’s that our technological revolution is the big social revolution that we live with
  • The idea, for instance, that the printing press rapidly gave birth to a new order of information, democratic and bottom-up, is a cruel cartoon of the truth. If the printing press did propel the Reformation, one of the biggest ideas it propelled was Luther’s newly invented absolutist anti-Semitism. And what followed the Reformation wasn’t the Enlightenment, a new era of openness and freely disseminated knowledge. What followed the Reformation was, actually, the Counter-Reformation, which used the same means—i.e., printed books—to spread ideas about what jerks the reformers were, and unleashed a hundred years of religious warfare.
  • Robert K. Logan’s “The Sixth Language” begins with the claim that cognition is not a little processing program that takes place inside your head, Robby the Robot style. It is a constant flow of information, memory, plans, and physical movements, in which as much thinking goes on out there as in here. If television produced the global village, the Internet produces the global psyche: everyone keyed in like a neuron, so that to the eyes of a watching Martian we are really part of a single planetary brain. Contraptions don’t change consciousness; contraptions are part of consciousness.
  • In a practical, immediate way, one sees the limits of the so-called “extended mind” clearly in the mob-made Wikipedia, the perfect product of that new vast, supersized cognition: when there’s easy agreement, it’s fine, and when there’s widespread disagreement on values or facts, as with, say, the origins of capitalism, it’s fine, too; you get both sides. The trouble comes when one side is right and the other side is wrong and doesn’t know it. The Shakespeare authorship page and the Shroud of Turin page are scenes of constant conflict and are packed with unreliable information. Creationists crowd cyberspace every bit as effectively as evolutionists, and extend their minds just as fully. Our trouble is not the over-all absence of smartness but the intractable power of pure stupidity, and no machine, or mind, seems extended enough to cure that.
  • “The medium does matter,” Carr has written. “As a technology, a book focuses our attention, isolates us from the myriad distractions that fill our everyday lives. A networked computer does precisely the opposite. It is designed to scatter our attention. . . . Knowing that the depth of our thought is tied directly to the intensity of our attentiveness, it’s hard not to conclude that as we adapt to the intellectual environment of the Net our thinking becomes shallower.”
  • When people struggle to describe the state that the Internet puts them in, they arrive at a remarkably familiar picture of dissociation and fragmentation. Life was once whole, continuous, stable; now it is fragmented, multi-part, shimmering around us, unstable and impossible to fix.
  • The odd thing is that this complaint, though deeply felt by our contemporary Better-Nevers, is identical to Baudelaire’s perception about modern Paris in 1855, or Walter Benjamin’s about Berlin in 1930, or Marshall McLuhan’s in the face of three-channel television (and Canadian television, at that) in 1965.
  • If all you have is a hammer, the saying goes, everything looks like a nail; and, if you think the world is broken, every machine looks like the hammer that broke it.
  • What we live in is not the age of the extended mind but the age of the inverted self. The things that have usually lived in the darker recesses or mad corners of our mind—sexual obsessions and conspiracy theories, paranoid fixations and fetishes—are now out there: you click once and you can read about the Kennedy autopsy or the Nazi salute or hog-tied Swedish flight attendants. But things that were once external and subject to the social rules of caution and embarrassment—above all, our interactions with other people—are now easily internalized, made to feel like mere workings of the id left on its own.
  • Anyway, the crucial revolution was not of print but of paper: “During the later Middle Ages a staggering growth in the production of manuscripts, facilitated by the use of paper, accompanied a great expansion of readers outside the monastic and scholastic contexts.” For that matter, our minds were altered less by books than by index slips. Activities that seem quite twenty-first century, she shows, began when people cut and pasted from one manuscript to another; made aggregated news in compendiums; passed around précis. “Early modern finding devices” were forced into existence: lists of authorities, lists of headings.
  • The book index was the search engine of its era, and needed to be explained at length to puzzled researchers—as, for that matter, did the Hermione-like idea of “looking things up.” That uniquely evil and necessary thing, the comprehensive review of many different books on a related subject, with the necessary oversimplification of their ideas that it demanded, was already around in 1500, and already being accused of missing all the points.
  • At any given moment, our most complicated machine will be taken as a model of human intelligence, and whatever media kids favor will be identified as the cause of our stupidity. When there were automatic looms, the mind was like an automatic loom; and, since young people in the loom period liked novels, it was the cheap novel that was degrading our minds. When there were telephone exchanges, the mind was like a telephone exchange, and, in the same period, since the nickelodeon reigned, moving pictures were making us dumb. When mainframe computers arrived and television was what kids liked, the mind was like a mainframe and television was the engine of our idiocy. Some machine is always showing us Mind; some entertainment derived from the machine is always showing us Non-Mind.
  • Blair argues that the sense of “information overload” was not the consequence of Gutenberg but already in place before printing began.
  • A social network is crucially different from a social circle, since the function of a social circle is to curb our appetites and of a network to extend them.
  • And so the peacefulness, the serenity that we feel away from the Internet, and which all the Better-Nevers rightly testify to, has less to do with being no longer harried by others than with being less oppressed by the force of your own inner life. Shut off your computer, and your self stops raging quite as much or quite as loud.
  • Now television is the harmless little fireplace over in the corner, where the family gathers to watch “Entourage.” TV isn’t just docile; it’s positively benevolent. This makes you think that what made television so evil back when it was evil was not its essence but its omnipresence. Once it is not everything, it can be merely something. The real demon in the machine is the tirelessness of the user.
  • The Internet screen has always been like the palantír in Tolkien’s “Lord of the Rings”—the “seeing stone” that lets the wizards see the entire world. Its gift is great; the wizard can see it all. Its risk is real: evil things will register more vividly than the great mass of dull good. The peril isn’t that users lose their knowledge of the world. It’s that they can lose all sense of proportion. You can come to think that the armies of Mordor are not just vast and scary, which they are, but limitless and undefeatable, which they aren’t.

The American Scholar: The Decline of the English Department - William M. Chace - 1 views

  • The number of young men and women majoring in English has dropped dramatically; the same is true of philosophy, foreign languages, art history, and kindred fields, including history. As someone who has taught in four university English departments over the last 40 years, I am dismayed by this shift, as are my colleagues here and there across the land. And because it is probably irreversible, it is important to attempt to sort out the reasons—the many reasons—for what has happened.
  • English: from 7.6 percent of the majors to 3.9 percent
  • In one generation, then, the numbers of those majoring in the humanities dropped from a total of 30 percent to a total of less than 16 percent; during that same generation, business majors climbed from 14 percent to 22 percent.
  • History: from 18.5 percent to 10.7 percent
  • But the deeper explanation resides not in something that has happened to it, but in what it has done to itself. English has become less and less coherent as a discipline and, worse, has come near exhaustion as a scholarly pursuit.
  • The twin focus, then, was on the philological nature of the enterprise and the canon of great works to be studied in their historical evolution.
  • Studying English taught us how to write and think better, and to make articulate many of the inchoate impulses and confusions of our post-adolescent minds. We began to see, as we had not before, how such books could shape and refine our thinking. We began to understand why generations of people coming before us had kept them in libraries and bookstores and in classes such as ours. There was, we got to know, a tradition, a historical culture, that had been assembled around these books. Shakespeare had indeed made a difference—to people before us, now to us, and forever to the language of English-speaking people.
  • Today there are stunning changes in the student population: there are more and more gifted and enterprising students coming from immigrant backgrounds, students with only slender connections to Western culture and to the assumption that the “great books” of England and the United States should enjoy a fixed centrality in the world. What was once the heart of the matter now seems provincial. Why throw yourself into a study of something not emblematic of the world but representative of a special national interest? As the campus reflects the cultural, racial, and religious complexities of the world around it, reading British and American literature looks more and more marginal. From a global perspective, the books look smaller.
  • With the cost of a college degree surging upward during the last quarter century—tuition itself increasing far beyond any measure of inflation—and with consequent growth in loan debt after graduation, parents have become anxious about the relative earning power of a humanities degree. Their college-age children doubtless share such anxiety. When college costs were lower, anxiety could be kept at bay. (Berkeley in the early ’60s cost me about $100 a year, about $700 in today’s dollars.)
  • Economists, chemists, biologists, psychologists, computer scientists, and almost everyone in the medical sciences win sponsored research, grants, and federal dollars. By and large, humanists don’t, and so they find themselves as direct employees of the institution, consuming money in salaries, pensions, and operating needs—not external money but institutional money.
  • These, then, are some of the external causes of the decline of English: the rise of public education; the relative youth and instability (despite its apparent mature solidity) of English as a discipline; the impact of money; and the pressures upon departments within the modern university to attract financial resources rather than simply use them up.
  • several of my colleagues around the country have called for a return to the aesthetic wellsprings of literature, the rock-solid fact, often neglected, that it can indeed amuse, delight, and educate. They urge the teaching of English, or French, or Russian literature, and the like, in terms of the intrinsic value of the works themselves, in all their range and multiplicity, as well-crafted and appealing artifacts of human wisdom. Second, we should redefine our own standards for granting tenure, placing more emphasis on the classroom and less on published research, and we should prepare to contest our decisions with administrators whose science-based model is not an appropriate means of evaluation.
  • “It may be that what has happened to the profession is not the consequence of social or philosophical changes, but simply the consequence of a tank now empty.” His homely metaphor pointed to the absence of genuinely new frontiers of knowledge and understanding for English professors to explore.
  • In this country and in England, the study of English literature began in the latter part of the 19th century as an exercise in the scientific pursuit of philological research, and those who taught it subscribed to the notion that literature was best understood as a product of language.
  • No one has come forward in years to assert that the study of English (or comparative literature or similar undertakings in other languages) is coherent, does have self-limiting boundaries, and can be described as this but not that.
  • To teach English today is to do, intellectually, what one pleases. No sense of duty remains toward works of English or American literature; amateur sociology or anthropology or philosophy or comic books or studies of trauma among soldiers or survivors of the Holocaust will do. You need not even believe that works of literature have intelligible meaning; you can announce that they bear no relationship at all to the world beyond the text.
  • With everything on the table, and with foundational principles abandoned, everyone is free, in the classroom or in prose, to exercise intellectual laissez-faire in the largest possible way—I won’t interfere with what you do and am happy to see that you will return the favor
  • Consider the English department at Harvard University. It has now agreed to remove its survey of English literature for undergraduates, replacing it and much else with four new “affinity groups”
  • there would be no one book, or family of books, that every English major at Harvard would have read by the time he or she graduates. The direction to which Harvard would lead its students in this “clean slate” or “trickle down” experiment is to suspend literary history, thrusting into the hands of undergraduates the job of cobbling together intellectual coherence for themselves
  • Those who once strove to give order to the curriculum will have learned, from Harvard, that terms like core knowledge and foundational experience only trigger acrimony, turf protection, and faculty mutinies. No one has the stomach anymore to refight the Western culture wars. Let the students find their own way to knowledge.
  • In English, the average number of years spent earning a doctoral degree is almost 11. After passing that milestone, only half of new Ph.D.’s find teaching jobs, the number of new positions having declined over the last year by more than 20 percent; many of those jobs are part-time or come with no possibility of tenure. News like that, moving through student networks, can be matched against, at least until recently, the reputed earning power of recent graduates of business schools, law schools, and medical schools. The comparison is akin to what young people growing up in Rust Belt cities are forced to see: the work isn’t here anymore; our technology is obsolete.
  • Unlike other members of the university community, they might well have been plying their trade without proper credentials: “Whereas economists or physicists, geologists or climatologists, physicians or lawyers must master a body of knowledge before they can even think of being licensed to practice,” she said, “we literary scholars, it is tacitly assumed, have no definable expertise.”
  • English departments need not refight the Western culture wars. But they need to fight their own book wars. They must agree on which texts to teach and argue out the choices and the principles of making them if they are to claim the respect due a department of study.
  • They can teach their students to write well, to use rhetoric. They should place their courses in composition and rhetoric at the forefront of their activities. They should announce that the teaching of composition is a skill their instructors have mastered and that students majoring in English will be certified, upon graduation, as possessing rigorously tested competence in prose expression.
  • The study of literature will then take on the profile now held, with moderate dignity, by the study of the classics, Greek and Latin.
  • But we can, we must, do better. At stake are the books themselves and what they can mean to the young. Yes, it is just a literary tradition. That’s all. But without such traditions, civil societies have no compass to guide them.

How the World's Oldest Wooden Sculpture Is Reshaping Prehistory - The New York Times - 0 views

  • How the World’s Oldest Wooden Sculpture Is Reshaping Prehistory
  • At 12,500 years old, the Shigir Idol is by far the earliest known work of ritual art. Only decay has kept others from being found.
  • The world’s oldest known wooden sculpture — a nine-foot-tall totem pole thousands of years old — looms over a hushed chamber of an obscure Russian museum in the Ural Mountains, not far from the Siberian border
  • Shigir Idol
  • Dug out of a peat bog by gold miners in 1890, the relic, or what’s left of it, is carved from a great slab of freshly cut larch.
  • Scattered among the geometric patterns (zigzags, chevrons, herringbones) are eight human faces, each with slashes for eyes that peer not so benignly from the front and back planes.
  • “Whether it screams or shouts or sings, it projects authority, possibly malevolent authority. It’s not immediately a friend of yours, much less an ancient friend of yours.”
  • In archaeology, portable prehistoric sculpture is called “mobiliary art.”
  • The statue’s age was a matter of conjecture until 1997, when it was carbon-dated by Russian scientists to about 9,500 years old, an age that struck most scholars as fanciful.
  • The statue was more than twice as old as the Egyptian pyramids and Stonehenge, as well as, by many millenniums, the first known work of ritual art.
  • A new study that Dr. Terberger wrote with some of the same colleagues in Quaternary International further skews our understanding of prehistory by pushing back the original date of the Shigir Idol by another 900 years, placing it in the context of the early art in Eurasia.
  • “During the period of rapid cooling from about 10,700 B.C. to 9,600 B.C. that we call the Younger Dryas, no beavers should have been around in the Transurals,” he said.
  • Written with an eye toward disentangling Western science from colonialism, Dr. Terberger’s latest paper challenges the ethnocentric notion that pretty much everything, including symbolic expression and philosophical perceptions of the world, came to Europe by way of the sedentary farming communities in the Fertile Crescent 8,000 years ago.
  • “It’s similar to the ‘Neanderthals did not make art’ fable, which was entirely based on absence of evidence.”
  • Likewise, the overwhelming scientific consensus used to hold that modern humans were superior in key ways, including their ability to innovate, communicate and adapt to different environments.
  • Nonsense, all of it.”
  • This makes it clear that arguments about the wealth of mobiliary art in, say, the Upper Paleolithic of Germany or France by comparison to southern Europe are largely nonsensical: an artifact of tundra environments (where there are no trees and you use ivory, which is archaeologically visible) versus open-forest environments.
  • The Shigir Idol, named for the bog near Kirovgrad in which it was found, is presumed to have rested on a rock base for perhaps two or three decades before toppling into a long-gone paleo-lake, where the peat’s antimicrobial properties protected it like a time capsule.
  • “It was not a scientific construction,”
  • “The rings tell us that trees were growing very slowly, as the temperature was still quite cold,”
  • Dr. Terberger respectfully disagrees.
  • “The landscape changed, and the art — figurative designs and naturalistic animals painted in caves and carved in rock — did, too, perhaps as a way to help people come to grips with the challenging environments they encountered.”
  • And what do the engravings mean? Svetlana Savchenko, the artifact’s curator and an author on the study, speculates that the eight faces may well contain encrypted information about ancestor spirits, the boundary between earth and sky, or a creation myth.
  • The temple’s stones were carved around 11,000 years ago, which makes them 1,500 years younger than the Shigir Idol.
  • One could wonder how many similar pieces have been lost over time due to poor preservation conditions.”
  • The similarity of the geometric motifs to others across Europe in that era, he added, “is evidence of long-distance contacts and a shared sign language over vast areas. The sheer size of the idol also seems to indicate it was meant as a marker in the landscape that was supposed to be seen by other hunter-gatherer groups — perhaps marking the border of a territory, a warning or welcoming sign.”
  • “What do you think is the hardest thing to find in the Stone Age archaeology of the Urals?” A pause. Sites? “No,” he said, sighing softly. “Funding.”

Noam Chomsky on Where Artificial Intelligence Went Wrong - Yarden Katz - The Atlantic - 0 views

  • If you take a look at the progress of science, the sciences are kind of a continuum, but they're broken up into fields. The greatest progress is in the sciences that study the simplest systems. So take, say, physics -- greatest progress there. But one of the reasons is that the physicists have an advantage that no other branch of sciences has. If something gets too complicated, they hand it to someone else.
  • If a molecule is too big, you give it to the chemists. The chemists, for them, if the molecule is too big or the system gets too big, you give it to the biologists. And if it gets too big for them, they give it to the psychologists, and finally it ends up in the hands of the literary critic, and so on.
  • Neuroscience for the last couple hundred years has been on the wrong track. There's a fairly recent book by a very good cognitive neuroscientist, Randy Gallistel, written with King, arguing -- in my view, plausibly -- that neuroscience developed kind of enthralled to associationism and related views of the way humans and animals work. And as a result they've been looking for things that have the properties of associationist psychology.
  • In general what he argues is that if you take a look at animal cognition, human too, it's computational systems. Therefore, you want to look at the units of computation. Think about a Turing machine, say, which is the simplest form of computation: you have to find units that have properties like "read", "write" and "address." That's the minimal computational unit, so you've got to look in the brain for those. You're never going to find them if you look for strengthening of synaptic connections or field properties, and so on. You've got to start by looking for what's there and what's working, and you see that from Marr's highest level.
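The "read", "write" and "address" operations Chomsky names can be made concrete with a toy Turing machine. This is a minimal sketch for illustration only; the machine, its state names, and its single rule set (flip every bit, then halt at the first blank) are invented here, not drawn from Gallistel's book.

```python
# A minimal Turing machine: a transition table mapping (state, symbol)
# to (symbol to write, head movement, next state). The three primitives
# are exactly "read" (look at the cell under the head), "write" (replace
# it), and "address" (move the head). Rules invented for illustration:
# this machine flips every bit on the tape and halts at the first blank.
def run(tape):
    table = {
        ("flip", "0"): ("1", 1, "flip"),   # read 0 -> write 1, move right
        ("flip", "1"): ("0", 1, "flip"),   # read 1 -> write 0, move right
        ("flip", "_"): ("_", 0, "halt"),   # blank cell -> stop
    }
    tape = list(tape) + ["_"]              # "_" marks the end of the input
    head, state = 0, "flip"
    while state != "halt":
        symbol = tape[head]                        # read
        write, move, state = table[(state, symbol)]
        tape[head] = write                         # write
        head += move                               # address update
    return "".join(tape).rstrip("_")

print(run("0110"))  # 1001
```

Even this trivial machine shows the point of the argument: the computational description (table, head, tape) is a different level of abstraction from whatever physical substrate implements it.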
  • It's basically in the spirit of Marr's analysis. So when you're studying vision, he argues, you first ask what kind of computational tasks the visual system is carrying out. And then you look for an algorithm that might carry out those computations, and finally you search for mechanisms of the kind that would make the algorithm work. Otherwise, you may never find anything.
  • AI and robotics got to the point where you could actually do things that were useful, so it turned to the practical applications and somewhat, maybe not abandoned, but put to the side, the more fundamental scientific questions, just caught up in the success of the technology and achieving specific goals.
  • "Good Old Fashioned AI," as it's labeled now, made strong use of formalisms in the tradition of Gottlob Frege and Bertrand Russell, mathematical logic for example, or derivatives of it, like nonmonotonic reasoning and so on. It's interesting from a history of science perspective that even very recently, these approaches have been almost wiped out from the mainstream and have been largely replaced -- in the field that calls itself AI now -- by probabilistic and statistical models. My question is, what do you think explains that shift and is it a step in the right direction?
  • The approximating unanalyzed data kind is sort of a new approach, not totally, there's things like it in the past. It's basically a new approach that has been accelerated by the existence of massive memories, very rapid processing, which enables you to do things like this that you couldn't have done by hand. But I think, myself, that it is leading subjects like computational cognitive science into a direction of maybe some practical applicability... [Interviewer: ...in engineering?] Chomsky: ...But away from understanding.
  • I was very skeptical about the original work. I thought it was first of all way too optimistic, it was assuming you could achieve things that required real understanding of systems that were barely understood, and you just can't get to that understanding by throwing a complicated machine at it.
  • If success is defined as getting a fair approximation to a mass of chaotic unanalyzed data, then it's way better to do it this way than to do it the way the physicists do, you know, no thought experiments about frictionless planes and so on and so forth. But you won't get the kind of understanding that the sciences have always been aimed at -- what you'll get at is an approximation to what's happening.
  • Suppose you want to predict tomorrow's weather. One way to do it is okay I'll get my statistical priors, if you like, there's a high probability that tomorrow's weather here will be the same as it was yesterday in Cleveland, so I'll stick that in, and where the sun is will have some effect, so I'll stick that in, and you get a bunch of assumptions like that, you run the experiment, you look at it over and over again, you correct it by Bayesian methods, you get better priors. You get a pretty good approximation of what tomorrow's weather is going to be. That's not what meteorologists do -- they want to understand how it's working. And these are just two different concepts of what success means, of what achievement is.
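The statistical-priors caricature Chomsky describes can be sketched as a Bayesian update of a single rain probability. This is a deliberately minimal illustration of "correct it by Bayesian methods, you get better priors"; the Beta(1, 1) prior and the observation sequence are invented for the example, and real forecasting is of course far richer.

```python
from fractions import Fraction

# Beta(1, 1) prior on "it rains tomorrow"; each observed day updates
# the posterior by simple counting (conjugate Beta-Bernoulli update).
alpha, beta = Fraction(1), Fraction(1)
observations = [1, 0, 1, 1, 0, 1, 1, 1]  # 1 = rain, 0 = no rain (invented)

for rained in observations:
    alpha += rained          # rainy days raise alpha
    beta += 1 - rained       # dry days raise beta

# Posterior mean = predicted probability of rain tomorrow.
p_rain = alpha / (alpha + beta)
print(p_rain)  # 7/10
```

The model gets a decent approximation from counts alone, which is exactly the contrast drawn in the excerpt: it predicts without containing any account of how weather works.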
  • Take a concrete example of a new field in neuroscience, called Connectomics, where the goal is to find the wiring diagram of very complex organisms, find the connectivity of all the neurons in, say, human cerebral cortex, or mouse cortex. This approach was criticized by Sidney Brenner, who in many ways is [historically] one of the originators of the approach. Advocates of this field don't stop to ask if the wiring diagram is the right level of abstraction -- maybe it's not.
  • The right approach is to try to see if you can understand what the fundamental principles are that deal with the core properties, and recognize that in the actual usage, there's going to be a thousand other variables intervening -- kind of like what's happening outside the window -- and you'll sort of tack those on later on if you want better approximations; that's a different approach.
  • If you get more and more data, and better and better statistics, you can get a better and better approximation to some immense corpus of text, like everything in The Wall Street Journal archives -- but you learn nothing about the language.
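The corpus-approximation point can be made concrete with the simplest statistical language model, a bigram frequency table. The toy corpus below is invented for illustration; the model reproduces word-to-word frequencies ever more faithfully as data grows, while containing no representation of grammar at all.

```python
from collections import Counter, defaultdict

# Count adjacent word pairs in a toy corpus (invented for illustration).
corpus = "the dog barks the cat meows the dog sleeps".split()
bigrams = Counter(zip(corpus, corpus[1:]))

# Group counts by first word: following[w] maps each successor to its count.
following = defaultdict(Counter)
for (a, b), n in bigrams.items():
    following[a][b] = n

# P(next word | "the") is just a normalized frequency table -- an
# approximation to the corpus, not an account of the language.
total = sum(following["the"].values())
probs = {w: n / total for w, n in following["the"].items()}
print(probs)
```

Here "the" is followed by "dog" two-thirds of the time and "cat" one-third of the time, and nothing in the table explains why "the the" never occurs; it merely records that it didn't.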
  • If you went to MIT in the 1960s, or now, it's completely different. No matter what engineering field you're in, you learn the same basic science and mathematics. And then maybe you learn a little bit about how to apply it. But that's a very different approach. And it resulted maybe from the fact that really for the first time in history, the basic sciences, like physics, had something really to tell engineers. And besides, technologies began to change very fast, so not very much point in learning the technologies of today if it's going to be different 10 years from now. So you have to learn the fundamental science that's going to be applicable to whatever comes along next. And the same thing pretty much happened in medicine.
  • That's the kind of transition from something like an art, that you learn how to practice -- an analog would be trying to match some data that you don't understand, in some fashion, maybe building something that will work -- to science, which is what happened in the modern period, roughly Galilean science.
  • It turns out that there actually are neural circuits which are reacting to particular kinds of rhythm, which happen to show up in language, like syllable length and so on. And there's some evidence that that's one of the first things that the infant brain is seeking -- rhythmic structures. And going back to Gallistel and Marr, it's got some computational system inside which is saying "okay, here's what I do with these things," and, say, by nine months, the typical infant has rejected -- eliminated from its repertoire -- the phonetic distinctions that aren't used in its own language.
  • People like Shimon Ullman discovered some pretty remarkable things like the rigidity principle. You're not going to find that by statistical analysis of data. But he did find it by carefully designed experiments. Then you look for the neurophysiology, and see if you can find something there that carries out these computations. I think it's the same in language, the same in studying our arithmetical capacity, planning, almost anything you look at. Just trying to deal with the unanalyzed chaotic data is unlikely to get you anywhere, just as it wouldn't have gotten Galileo anywhere.
  • with regard to cognitive science, we're kind of pre-Galilean, just beginning to open up the subject
  • You can invent a world -- I don't think it's our world -- but you can invent a world in which nothing happens except random changes in objects and selection on the basis of external forces. I don't think that's the way our world works; I don't think it's the way any biologist thinks it is. There are all kinds of ways in which natural law imposes channels within which selection can take place, and some things can happen and other things don't happen. Plenty of things that go on in the biology of organisms aren't like this. So take the first step, meiosis. Why do cells split into spheres and not cubes? It's not random mutation and natural selection; it's a law of physics. There's no reason to think that laws of physics stop there; they work all the way through. Interviewer: Well, they constrain the biology, sure. Chomsky: Okay, well then it's not just random mutation and selection. It's random mutation, selection, and everything that matters, like laws of physics.
  • What I think is valuable is the history of science. I think we learn a lot of things from the history of science that can be very valuable to the emerging sciences. Particularly when we realize that in, say, the emerging cognitive sciences, we really are in a kind of pre-Galilean stage. We don't know what we're looking for any more than Galileo did, and there's a lot to learn from that.

Kung Fu for Philosophers - NYTimes.com - 0 views

  • any ability resulting from practice and cultivation could accurately be said to embody kung fu.
  • the predominant orientation of traditional Chinese philosophy is the concern about how to live one’s life, rather than finding out the truth about reality.
  • Confucius’s call for “rectification of names” — one must use words appropriately — is more a kung fu method for securing sociopolitical order than for capturing the essence of things, as “names,” or words, are placeholders for expectations of how the bearer of the names should behave and be treated. This points to a realization of what J. L. Austin calls the “performative” function of language.
  • Instead of leading to a search for certainty, as Descartes’s dream did, Zhuangzi came to the realization that he had perceived “the transformation of things,” indicating that one should go along with this transformation rather than trying in vain to search for what is real.
  • It even expands epistemology into the non-conceptual realm in which the accessibility of knowledge is dependent on the cultivation of cognitive abilities, and not simply on whatever is “publicly observable” to everyone. It also shows that cultivation of the person is not confined to “knowing how.” An exemplary person may well have the great charisma to affect others but does not necessarily know how to affect others.
  • The Buddhist doctrine of no-self surely looks metaphysical, but its real aim is to free one from suffering, since according to Buddhism suffering comes ultimately from attachment to the self. Buddhist meditations are kung fu practices to shake off one’s attachment, and not just intellectual inquiries for getting propositional truth.
  • The essence of kung fu — various arts and instructions about how to cultivate the person and conduct one’s life — is often hard to digest for those who are used to the flavor and texture of mainstream Western philosophy. It is understandable that, even after sincere willingness to try, one is often still turned away by the lack of clear definitions of key terms and the absence of linear arguments in classic Chinese texts. This, however, is not a weakness, but rather a requirement of the kung fu orientation — not unlike the way that learning how to swim requires one to focus on practice and not on conceptual understanding.
  • the views of Mencius and his later opponent Xunzi about human nature are more recommendations of how one should view oneself in order to become a better person than metaphysical assertions about whether humans are by nature good or bad. Though each man’s assertions about human nature are incompatible with the other’s, they may still function inside the Confucian tradition as alternative ways of cultivation.
  • Western philosophy at its origin is similar to classic Chinese philosophy. The significance of this point is not merely in revealing historical facts. It calls our attention to a dimension that has been eclipsed by the obsession with the search for eternal, universal truth and the way it is practiced, namely through rational arguments.
  • One might well consider the Chinese kung fu perspective a form of pragmatism.  The proximity between the two is probably why the latter was well received in China early last century when John Dewey toured the country. What the kung fu perspective adds to the pragmatic approach, however, is its clear emphasis on the cultivation and transformation of the person, a dimension that is already in Dewey and William James but that often gets neglected
  • A kung fu master does not simply make good choices and use effective instruments to satisfy whatever preferences a person happens to have. In fact the subject is never simply accepted as a given. While an efficacious action may be the result of a sound rational decision, a good action that demonstrates kung fu has to be rooted in the entire person, including one’s bodily dispositions and sentiments, and its goodness is displayed not only through its consequences but also in the artistic style with which one does it. It also brings forward what Charles Taylor calls the “background” — elements such as tradition and community — in our understanding of the formation of a person’s beliefs and attitudes. Through the kung fu approach, classic Chinese philosophy displays a holistic vision that brings together these marginalized dimensions and thereby forces one to pay close attention to the ways they affect each other.
  • This kung fu approach shares a lot of insights with the Aristotelian virtue ethics, which focuses on the cultivation of the agent instead of on the formulation of rules of conduct. Yet unlike Aristotelian ethics, the kung fu approach to ethics does not rely on any metaphysics for justification.
  • This approach opens up the possibility of allowing multiple competing visions of excellence, including the metaphysics or religious beliefs by which they are understood and guided, and justification of these beliefs is then left to the concrete human experiences.
  • it is more appropriate to consider kung fu as a form of art. Art is not ultimately measured by its dominance of the market. In addition, the function of art is not accurate reflection of the real world; its expression is not constrained to the form of universal principles and logical reasoning, and it requires cultivation of the artist, embodiment of virtues/virtuosities, and imagination and creativity.
  • If philosophy is “a way of life,” as Pierre Hadot puts it, the kung fu approach suggests that we take philosophy as the pursuit of the art of living well, and not just as a narrowly defined rational way of life.

Art Makes You Smart - NYTimes.com - 1 views

  • Through a large-scale, random-assignment study of school tours to the museum, we were able to determine that strong causal relationships do in fact exist between arts education and a range of desirable outcomes.
  • Students who, by lottery, were selected to visit the museum on a field trip demonstrated stronger critical thinking skills, displayed higher levels of social tolerance, exhibited greater historical empathy and developed a taste for art museums and cultural institutions.
  • Over the course of the following year, nearly 11,000 students and almost 500 teachers participated in our study, roughly half of whom had been selected by lottery to visit the museum
  • Applicant groups who won the lottery constituted our treatment group, while those who did not win an immediate tour served as our control group.
  • Several weeks after the students in the treatment group visited the museum, we administered surveys to all of the students. The surveys included multiple items that assessed knowledge about art, as well as measures of tolerance, historical empathy and sustained interest in visiting art museums and other cultural institutions. We also asked them to write an essay in response to a work of art that was unfamiliar to them.
  • Moreover, most of the benefits we observed are significantly larger for minority students, low-income students and students from rural schools — typically two to three times larger than for white, middle-class, suburban students — owing perhaps to the fact that the tour was the first time they had visited an art museum.
  • we can conclude that visiting an art museum exposes students to a diversity of ideas that challenge them with different perspectives on the human condition. Expanding access to art, whether through programs in schools or through visits to area museums and galleries, should be a central part of any school’s curriculum.

Why Our Children Don't Think There Are Moral Facts - NYTimes.com - 1 views

  • I already knew that many college-aged students don’t believe in moral facts.
  • the overwhelming majority of college freshmen in their classrooms view moral claims as mere opinions that are not true or are true only relative to a culture.
  • where is the view coming from?
  • the Common Core standards used by a majority of K-12 programs in the country require that students be able to “distinguish among fact, opinion, and reasoned judgment in a text.”
  • So what’s wrong with this distinction and how does it undermine the view that there are objective moral facts?
  • For example, many people once thought that the earth was flat. It’s a mistake to confuse truth (a feature of the world) with proof (a feature of our mental lives)
  • Furthermore, if proof is required for facts, then facts become person-relative. Something might be a fact for me if I can prove it but not a fact for you if you can’t. In that case, E=MC2 is a fact for a physicist but not for me.
  • worse, students are taught that claims are either facts or opinions. They are given quizzes in which they must sort claims into one camp or the other but not both. But if a fact is something that is true and an opinion is something that is believed, then many claims will obviously be both
  • How does the dichotomy between fact and opinion relate to morality
  • Kids are asked to sort facts from opinions and, without fail, every value claim is labeled as an opinion.
  • Here’s a little test devised from questions available on fact vs. opinion worksheets online: are the following facts or opinions?
    — Copying homework assignments is wrong.
    — Cursing in school is inappropriate behavior.
    — All men are created equal.
    — It is worth sacrificing some personal liberties to protect our country from terrorism.
    — It is wrong for people under the age of 21 to drink alcohol.
    — Vegetarians are healthier than people who eat meat.
    — Drug dealers belong in prison.
  • Our children deserve a consistent intellectual foundation. Facts are things that are true. Opinions are things we believe. Some of our beliefs are true. Others are not. Some of our beliefs are backed by evidence. Others are not.
  • In summary, our public schools teach students that all claims are either facts or opinions and that all value and moral claims fall into the latter camp. The punchline: there are no moral facts. And if there are no moral facts, then there are no moral truths.
  • It should not be a surprise that there is rampant cheating on college campuses: If we’ve taught our students for 12 years that there is no fact of the matter as to whether cheating is wrong, we can’t very well blame them for doing so later on.
  • If it’s not true that it’s wrong to murder a cartoonist with whom one disagrees, then how can we be outraged? If there are no truths about what is good or valuable or right, how can we prosecute people for crimes against humanity? If it’s not true that all humans are created equal, then why vote for any political system that doesn’t benefit you over others?
  • the curriculum sets our children up for doublethink. They are told that there are no moral facts in one breath even as the next tells them how they ought to behave.
  • The answer? In each case, the worksheets categorize these claims as opinions. The explanation on offer is that each of these claims is a value claim and value claims are not facts. This is repeated ad nauseam: any claim with good, right, wrong, etc. is not a fact.
  • Professor McBrayer seems to miss the major point of the Common Core concern: can students distinguish between premises based on (reasonably construed) fact and premises based on emotion when evaluating conclusions? I would prefer that students learn to reason rather than be taught moral 'truth' that follows Professor McBrayer's logic.
  • The hard work lies not in recognizing that at least some moral claims are true but in carefully thinking through our evidence for which of the many competing moral claims is correct.
  • Moral truths are not the same as scientific truths or mathematical truths. Yet they may still be used as a guiding principle for our individual lives as well as our laws. But there is equal danger in giving moral judgments the designation of truth as there is in not doing so. Many people believe that abortion is murder on the same level as shooting someone with a gun. But many others do not. So is it true that abortion is murder? Moral principles can become generally accepted and then form the basis for our laws. But many long-accepted moral principles were later rejected as being faulty. "Separate but equal" is an example. Judging homosexual relationships as immoral is another example.
  • Whoa! That Einstein derived an equation is a fact. But the equation represents a theory that may have to be tweaked at some point in the future. It may be a fact that the equation foretold the violence of atomic explosions, but there are aspects of nature that elude the equation. Remember "the theory of everything?"
  • Here is a moral fact: this is a sermon masquerading as a philosophical debate on facts, opinions and truth. This professor of religion is asserting that the government, via Common Core, is teaching atheism via the opinion-vs.-fact distinction. He is arguing, in a dishonest form, that public schools should be teaching moral facts. Of course "moral facts" is code for the Ten Commandments.
  • As a fourth grade teacher, I try to teach students to read critically, including distinguishing between facts and opinions as they read (and have been doing this long before the Common Core arrived, by the way). It's not always easy for children to grasp the difference. I can only imagine the confusion that would ensue if I introduced a third category -- moral "facts" that can't be proven but are true nonetheless!
  • horrible acts occur not because of moral uncertainty, but because people are too sure that their views on morality are 100% true, and anyone who fails to recognize and submit to them is a heathen who deserves death. I can't think of any case where a society has suffered because people are too thoughtful and open-minded to different perspectives on moral truth. In any case, it's not an elementary school's job to teach "moral truths."
  • The characterization of moral anti-realism as some sort of fringe view in philosophy is misleading. Claims that can be true or false are, it seems, 'made true' by features of the world. It's not clear to many in philosophy (like me) just what features of the world could make our moral claims true. We are more likely to see people's value claims as making claims about, and enforcing conformity to, our own (contingent) social norms. This is not to hold, as Mr. McBrayer seems to think follows, that there are no reasons to endorse or criticize these social norms.
  • This is nonsense. Giving kids the tools to distinguish between fact and opinion is hard enough in an age when Republicans actively deny reality on Fox News every night. The last thing we need is to muddy their thinking with the concept of "moral facts." A fact is a belief that everyone _should_ agree upon because it is observable and testable. Morals are not agreed upon by all. Consider the hot-button issue of abortion.
  • Truthfully, I'm not terribly concerned that third graders will end up taking these lessons in the definition of fact versus opinion to the extremes considered here, or take them as a license to cheat. That will come much later, when they figure out, as people always have, what they can get away with. But Prof. McBrayer, with his blithe expectation that all the grownups know that there are moral "facts"? He scares the heck out of me.
  • I've long chafed at the language of "fact" v. "opinion", which is grounded in a very particular, limited view of human cognition. In my own ethics courses, I work actively to undermine the distinction, focusing instead on considered judgment . . . or even more narrowly, on consideration itself. (See http://wp.me/p5Ag0i-6M )
  • The real waffle here is the very concept of "moral facts." Our statements of values, even very important ones are, obviously, not facts. Trying to dress them up as if they are facts, to me, argues for a pretty serious moral weakness on the part of those advancing the idea.
  • Our core values are not important because they are facts. They are important because we collectively hold them and cherish them. To lean on the false crutch of "moral facts" is to admit the weakness of your own moral convictions.
  • I would like to believe that there is a core of moral facts/values upon which all humanity can agree, but it would be tough to identify exactly what those are.
  • For the ancient philosophers, reality comprised the Good, the True, and the Beautiful (what we might now call ethics, science and art), seeing these as complementary and inseparable, though distinct, realms. With the ascendancy of science in our culture as the only valid measure of reality, to the detriment of ethics and art (that is, if it is not observable and provable, it is not real), we have turned the good and the beautiful into mere "social constructs" that have no validity on their own. While I am sympathetic in many ways with Dr. McBrayer's objections, I think he falls into the trap of discounting the Good and the Beautiful as valid in and of themselves, and tries, instead, to find ways to give them validity through the True. I think his argument would have been stronger had he used the language of validity rather than the language of truth. Goodness, Truth and Beauty each have their own validity, though interdependent and inseparable. When we artificially extract one of these and give it primacy, we distort reality and alienate ourselves from it.
  • Value claims are like any other claims: either true or false, evidenced or not.
  • Moral issues cannot scientifically be treated on the level that Prof. McBrayer is attempting to use in this column: true or false, fact or opinion or both. Instead, they should be treated as important characteristics of the systematic working of a society or of a group of people in general. One can compare the working of two groups of people: one in which e.g. cheating and lying is acceptable, and one in which they are not. One can use historical or model examples to show the consequences and the working of specific systems of morals. I think that this method - suitably adjusted - can be used even in second grade.
  • Relativism has nothing to do with liberalism. The second point is that I'm not sure it does all that much harm, because I have yet to encounter a student who thought that he or she had to withhold judgment on those who hold opposing political views!

Liu Cixin's War of the Worlds | The New Yorker - 0 views

  • he briskly dismissed the idea that fiction could serve as commentary on history or on current affairs. “The whole point is to escape the real world!” he said.
  • Chinese tech entrepreneurs discuss the Hobbesian vision of the trilogy as a metaphor for cutthroat competition in the corporate world; other fans include Barack Obama, who met Liu in Beijing two years ago, and Mark Zuckerberg. Liu’s international career has become a source of national pride. In 2015, China’s then Vice-President, Li Yuanchao, invited Liu to Zhongnanhai—an off-limits complex of government accommodation sometimes compared to the Kremlin—to discuss the books and showed Liu his own copies, which were dense with highlights and annotations.
  • In China, one of his stories has been a set text in the gao kao—the notoriously competitive college-entrance exams that determine the fate of ten million pupils annually; another has appeared in the national seventh-grade-curriculum textbook. When a reporter recently challenged Liu to answer the middle-school questions about the “meaning” and the “central themes” of his story, he didn’t get a single one right. “I’m a writer,” he told me, with a shrug.
  • Liu’s tomes—they tend to be tomes—have been translated into more than twenty languages, and the trilogy has sold some eight million copies worldwide. He has won China’s highest honor for science-fiction writing, the Galaxy Award, nine times, and in 2015 he became the first Asian writer to win the Hugo Award, the most prestigious international science-fiction prize
  • Liu believes that this trend signals a deeper shift in the Chinese mind-set—that technological advances have spurred a new excitement about the possibilities of cosmic exploration.
  • Concepts that seemed abstract to others took on, for him, concrete forms; they were like things he could touch, inducing a “druglike euphoria.” Compared with ordinary literature, he came to feel, “the stories of science are far more magnificent, grand, involved, profound, thrilling, strange, terrifying, mysterious, and even emotional
  • Pragmatic choices like this one, or like the decision his grandparents made when their sons were conscripted, recur in his fiction—situations that present equally unconscionable choices on either side of a moral fulcrum
  • The great flourishing of science fiction in the West at the end of the nineteenth century occurred alongside unprecedented technological progress and the proliferation of the popular press—transformations that were fundamental to the development of the genre
  • Joel Martinsen, the translator of the second volume of Liu’s trilogy, sees the series as a continuation of this tradition. “It’s not hard to read parallels between the Trisolarans and imperialist designs on China, driven by hunger for resources and fear of being wiped out,” he told me. Even Liu, unwilling as he is to endorse comparisons between the plot and China’s current face-off with the U.S., did at one point let slip that “the relationship between politics and science fiction cannot be underestimated.”
  • Speculative fiction is the art of imagining alternative worlds, and the same political establishment that permits it to be used as propaganda for the existing regime is also likely to recognize its capacity to interrogate the legitimacy of the status quo.
  • Liu has been criticized for peopling his books with characters who seem like cardboard cutouts installed in magnificent dioramas. Liu readily admits to the charge. “I did not begin writing for love of literature,” he told me. “I did so for love of science.”
  • “The Three-Body Problem” takes its title from an analytical problem in orbital mechanics which has to do with the unpredictable motion of three bodies under mutual gravitational pull. Reading an article about the problem, Liu thought, What if the three bodies were three suns? How would intelligent life on a planet in such a solar system develop? From there, a structure gradually took shape that almost resembles a planetary system, with characters orbiting the central conceit like moons. For better or worse, the characters exist to support the framework of the story rather than to live as individuals on the page.
  • Liu’s imagination is dauntingly capacious, his narratives conceived on a scale that feels, at times, almost hallucinogenic. The time line of the trilogy spans 18,906,450 years, encompassing ancient Egypt, the Qin dynasty, the Byzantine Empire, the Cultural Revolution, the present, and a time eighteen million years in the future
  • The first book is set on Earth, though some of its scenes take place in virtual reality; by the end of the third book, the scope of the action is interstellar and annihilation unfolds across several dimensions. The London Review of Books has called the trilogy “one of the most ambitious works of science fiction ever written.”
  • Although physics furnishes the novels’ premises, it is politics that drives the plots. At every turn, the characters are forced to make brutal calculations in which moral absolutism is pitted against the greater good
  • In Liu’s fictional universe, idealism is fatal and kindness an exorbitant luxury. As one general says in the trilogy, “In a time of war, we can’t afford to be too scrupulous.” Indeed, it is usually when people do not play by the rules of Realpolitik that the most lives are lost.
  • “I know what you are thinking,” he told me with weary clarity. “What about individual liberty and freedom of governance?” He sighed, as if exhausted by a debate going on in his head. “But that’s not what Chinese people care about. For ordinary folks, it’s the cost of health care, real-estate prices, their children’s education. Not democracy.”
  • Liu closed his eyes for a long moment and then said quietly, “This is why I don’t like to talk about subjects like this. The truth is you don’t really—I mean, can’t truly—understand.”
  • Liu explained to me, the existing regime made the most sense for today’s China, because to change it would be to invite chaos. “If China were to transform into a democracy, it would be hell on earth,”
  • It was an opinion entirely consistent with his systems-level view of human societies, just as mine reflected a belief in democracy and individualism as principles to be upheld regardless of outcomes
  • “I cannot escape and leave behind reality, just like I cannot leave behind my shadow. Reality brands each of us with its indelible mark. Every era puts invisible shackles on those who have lived through it, and I can only dance in my chains.
  • Chinese people of his generation were lucky, he said. The changes they had seen were so huge that they now inhabited a world entirely different from that of their childhood. “China is a futuristic country,” he said. “I realized that the world around me became more and more like science fiction, and this process is speeding up.”
  • “We have statues of a few martyrs, but we never—We don’t memorialize those, the individuals.” He took off his glasses and blinked, peering into the wide expanse of green and concrete. “This is how we Chinese have always been,” he said. “When something happens, it passes, and time buries the stories.”
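The "unpredictable motion of three bodies under mutual gravitational pull" that inspired Liu's title is easy to glimpse numerically. Below is a minimal sketch of my own (an illustration, not anything from the article or the novels): three equal masses start in the classic Lagrange equilateral configuration, which is known to be unstable for equal masses, and a second run perturbs one coordinate by a billionth of a unit. After a short integration the two runs have measurably drifted apart — the sensitivity to initial conditions that makes the system's long-term motion unpredictable.

```python
import math

def accelerations(pos, m, G=1.0):
    """Pairwise Newtonian gravitational accelerations in 2-D."""
    n = len(pos)
    acc = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            r3 = (dx * dx + dy * dy) ** 1.5
            acc[i][0] += G * m[j] * dx / r3
            acc[i][1] += G * m[j] * dy / r3
    return acc

def integrate(pos, vel, m, dt, steps):
    """Velocity-Verlet integration; returns final positions."""
    pos = [p[:] for p in pos]
    vel = [v[:] for v in vel]
    acc = accelerations(pos, m)
    for _ in range(steps):
        for i in range(len(pos)):
            pos[i][0] += vel[i][0] * dt + 0.5 * acc[i][0] * dt * dt
            pos[i][1] += vel[i][1] * dt + 0.5 * acc[i][1] * dt * dt
        new_acc = accelerations(pos, m)
        for i in range(len(pos)):
            vel[i][0] += 0.5 * (acc[i][0] + new_acc[i][0]) * dt
            vel[i][1] += 0.5 * (acc[i][1] + new_acc[i][1]) * dt
        acc = new_acc
    return pos

# Lagrange equilateral triangle, three equal masses, G = 1, unit side:
# circumradius r = 1/sqrt(3), angular velocity omega = sqrt(3), so each
# body orbits the centroid -- a periodic solution, but an unstable one.
m = [1.0, 1.0, 1.0]
r = 1.0 / math.sqrt(3.0)
omega = math.sqrt(3.0)
angles = [math.pi / 2 + k * 2 * math.pi / 3 for k in range(3)]
pos = [[r * math.cos(a), r * math.sin(a)] for a in angles]
vel = [[-omega * r * math.sin(a), omega * r * math.cos(a)] for a in angles]

eps = 1e-9                     # tiny perturbation of one coordinate
pos2 = [p[:] for p in pos]
pos2[0][0] += eps

a = integrate(pos, vel, m, 0.001, 3000)   # integrate both runs to t = 3
b = integrate(pos2, vel, m, 0.001, 3000)
gap = math.hypot(a[0][0] - b[0][0], a[0][1] - b[0][1])
print(gap)   # separation between the two runs' first body at t = 3
```

Because the equal-mass triangle is linearly unstable, the gap grows rather than staying at the seeded 1e-9; with more bodies or longer horizons no closed-form prediction exists, which is the point Liu builds the novel's alien solar system around.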

Among the Disrupted - The New York Times - 0 views

  • even as technologism, which is not the same as technology, asserts itself over more and more precincts of human life, so too does scientism, which is not the same as science.
  • The notion that the nonmaterial dimensions of life must be explained in terms of the material dimensions, and that nonscientific understandings must be translated into scientific understandings if they are to qualify as knowledge, is increasingly popular inside and outside the university,
  • So, too, does the view that the strongest defense of the humanities lies not in the appeal to their utility — that literature majors may find good jobs, that theaters may economically revitalize neighborhoods — but rather in the appeal to their defiantly nonutilitarian character, so that individuals can know more than how things work, and develop their powers of discernment and judgment, their competence in matters of truth and goodness and beauty, to equip themselves adequately for the choices and the crucibles of private and public life.
  • The contrary insistence that the glories of art and thought are not evolutionary adaptations, or that the mind is not the brain, or that love is not just biology’s bait for sex, now amounts to a kind of heresy.
  • We are not becoming transhumanists, obviously. We are too singular for the Singularity. But are we becoming posthumanists?
  • In American culture right now, as I say, the worldview that is ascendant may be described as posthumanism.
  • The posthumanism of the 1970s and 1980s was more insular, an academic affair of “theory,” an insurgency of professors; our posthumanism is a way of life, a social fate.
  • In “The Age of the Crisis of Man: Thought and Fiction in America, 1933-1973,” the gifted essayist Mark Greif, who reveals himself to be also a skillful historian of ideas, charts the history of the 20th-century reckonings with the definition of “man.”
  • Here is his conclusion: “Anytime your inquiries lead you to say, ‘At this moment we must ask and decide who we fundamentally are, our solution and salvation must lie in a new picture of ourselves and humanity, this is our profound responsibility and a new opportunity’ — just stop.” Greif seems not to realize that his own book is a lasting monument to precisely such inquiry, and to its grandeur
  • “Answer, rather, the practical matters,” he counsels, in accordance with the current pragmatist orthodoxy. “Find the immediate actions necessary to achieve an aim.” But before an aim is achieved, should it not be justified? And the activity of justification may require a “picture of ourselves.” Don’t just stop. Think harder. Get it right.
  • Greif’s book is a prehistory of our predicament, of our own “crisis of man.” (The “man” is archaic, the “crisis” is not.) It recognizes that the intellectual history of modernity may be written in part as the epic tale of a series of rebellions against humanism
  • Who has not felt superior to humanism? It is the cheapest target of all: Humanism is sentimental, flabby, bourgeois, hypocritical, complacent, middlebrow, liberal, sanctimonious, constricting and often an alibi for power
  • what is humanism? For a start, humanism is not the antithesis of religion, as Pope Francis is exquisitely demonstrating
  • The worldview takes many forms: a philosophical claim about the centrality of humankind to the universe, and about the irreducibility of the human difference to any aspect of our animality
  • a methodological claim about the most illuminating way to explain history and human affairs, and about the essential inability of the natural sciences to offer a satisfactory explanation; a moral claim about the priority, and the universal nature, of certain values, not least tolerance and compassion
  • And posthumanism? It elects to understand the world in terms of impersonal forces and structures, and to deny the importance, and even the legitimacy, of human agency.
  • There have been humane posthumanists and there have been inhumane humanists. But the inhumanity of humanists may be refuted on the basis of their own worldview
  • the condemnation of cruelty toward “man the machine,” to borrow the old but enduring notion of an 18th-century French materialist, requires the importation of another framework of judgment. The same is true about universalism, which every critic of humanism has arraigned for its failure to live up to the promise of a perfect inclusiveness
  • there has never been a universalism that did not exclude. Yet the same is plainly the case about every particularism, which is nothing but a doctrine of exclusion; and the correction of particularism, the extension of its concept and its care, cannot be accomplished in its own name. It requires an idea from outside, an idea external to itself, a universalistic idea, a humanistic idea.
  • Asking universalism to keep faith with its own principles is a perennial activity of moral life. Asking particularism to keep faith with its own principles is asking for trouble.
  • there is no more urgent task for American intellectuals and writers than to think critically about the salience, even the tyranny, of technology in individual and collective life
  • Here is a humanist proposition for the age of Google: The processing of information is not the highest aim to which the human spirit can aspire, and neither is competitiveness in a global economy. The character of our society cannot be determined by engineers.
  • “Our very mastery seems to escape our mastery,” Michel Serres has anxiously remarked. “How can we dominate our domination; how can we master our own mastery?”
  • universal accessibility is not the end of the story, it is the beginning. The humanistic methods that were practiced before digitalization will be even more urgent after digitalization, because we will need help in navigating the unprecedented welter
  • Searches for keywords will not provide contexts for keywords. Patterns that are revealed by searches will not identify their own causes and reasons
  • The new order will not relieve us of the old burdens, and the old pleasures, of erudition and interpretation.
  • Is all this — is humanism — sentimental? But sentimentality is not always a counterfeit emotion. Sometimes sentiment is warranted by reality.
  • The persistence of humanism through the centuries, in the face of formidable intellectual and social obstacles, has been owed to the truth of its representations of our complexly beating hearts, and to the guidance that it has offered, in its variegated and conflicting versions, for a soulful and sensitive existence
  • a complacent humanist is a humanist who has not read his books closely, since they teach disquiet and difficulty. In a society rife with theories and practices that flatten and shrink and chill the human subject, the humanist is the dissenter.

How To Look Smart, Ctd - The Daily Dish | By Andrew Sullivan - 0 views

  • these questions tend to overlook the way IQ tests are designed. As a neuropsychologist who has administered hundreds of these measures, I can tell you that their structures reflect a deeply embedded bias toward intelligence as a function of reading skills

Opinion | How to be Human - The New York Times - 0 views

  • I have learned something profound along the way. Being openhearted is a prerequisite for being a full, kind and wise human being. But it is not enough. People need social skills
  • The real process of, say, building a friendship or creating a community involves performing a series of small, concrete actions well: being curious about other people; disagreeing without poisoning relationships; revealing vulnerability at an appropriate pace; being a good listener; knowing how to ask for and offer forgiveness; knowing how to host a gathering where everyone feels embraced; knowing how to see things from another’s point of view.
  • People want to connect. Above almost any other need, human beings long to have another person look into their faces with love and acceptance
  • we lack practical knowledge about how to give one another the attention we crave
  • Some days it seems like we have intentionally built a society that gives people little guidance on how to perform the most important activities of life.
  • If I can shine positive attention on others, I can help them to blossom. If I see potential in others, they may come to see potential in themselves. True understanding is one of the most generous gifts any of us can give to another.
  • I see the results, too, in the epidemic of invisibility I encounter as a journalist. I often find myself interviewing people who tell me they feel unseen and disrespected
  • I’ve been working on a book called “How to Know a Person: The Art of Seeing Others Deeply and Being Deeply Seen.” I wanted it to be a practical book — so that I would learn these skills myself, and also, I hope, teach people how to understand others, how to make them feel respected, valued and understood.
  • I wanted to learn these skills for utilitarian reasons
  • If I’m going to work with someone, I don’t just want to see his superficial technical abilities. I want to understand him more deeply — to know whether he is calm in a crisis, comfortable with uncertainty or generous to colleagues.
  • I wanted to learn these skills for moral reasons
  • Many of the most productive researchers were in the habit of having breakfast or lunch with an electrical engineer named Harry Nyquist. Nyquist really listened to their challenges, got inside their heads, brought out the best in them. Nyquist, too, was an illuminator.
  • Finally, I wanted to learn these skills for reasons of national survival
  • We evolved to live with small bands of people like ourselves. Now we live in wonderfully diverse societies, but our social skills are inadequate for the divisions that exist. We live in a brutalizing time.
  • In any collection of humans, there are diminishers and there are illuminators. Diminishers are so into themselves, they make others feel insignificant
  • They stereotype and label. If they learn one thing about you, they proceed to make a series of assumptions about who you must be.
  • Illuminators, on the other hand, have a persistent curiosity about other people.
  • They have been trained or have trained themselves in the craft of understanding others. They know how to ask the right questions at the right times — so that they can see things, at least a bit, from another’s point of view. They shine the brightness of their care on people and make them feel bigger, respected, lit up.
  • A biographer of the novelist E.M. Forster wrote, “To speak with him was to be seduced by an inverse charisma, a sense of being listened to with such intensity that you had to be your most honest, sharpest, and best self.” Imagine how good it would be to offer people that kind of hospitality.
  • social clumsiness I encounter too frequently. I’ll be leaving a party or some gathering and I’ll realize: That whole time, nobody asked me a single question. I estimate that only 30 percent of the people in the world are good question askers. The rest are nice people, but they just don’t ask. I think it’s because they haven’t been taught to and so don’t display basic curiosity about others.
  • Many years ago, patent lawyers at Bell Labs were trying to figure out why some employees were much more productive than others.
  • Illuminators are a joy to be around
  • The gift of attention.
  • Each of us has a characteristic way of showing up in the world. A person who radiates warmth will bring out the glowing sides of the people he meets, while a person who conveys formality can meet the same people and find them stiff and detached. “Attention,” the psychiatrist Iain McGilchrist writes, “is a moral act: It creates, brings aspects of things into being.”
  • When Jimmy sees a person — any person — he is seeing a creature with infinite value and dignity, made in the image of God. He is seeing someone so important that Jesus was willing to die for that person.
  • Accompaniment.
  • Accompaniment is an other-centered way of being with people during the normal routines of life.
  • If we are going to accompany someone well, we need to abandon the efficiency mind-set. We need to take our time and simply delight in another person’s way of being
  • I know a couple who treasure friends who are what they call “lingerable.” These are the sorts of people who are just great company, who turn conversation into a form of play and encourage you to be yourself. It’s a great talent, to be lingerable.
  • Other times, a good accompanist does nothing more than practice the art of presence, just being there.
  • The art of conversation.
  • If you tell me something important and then I paraphrase it back to you, what psychologists call “looping,” we can correct any misimpressions that may exist between us.
  • Be a loud listener. When another person is talking, you want to be listening so actively you’re burning calories.
  • He’s continually responding to my comments with encouraging affirmations, with “amen,” “aha” and “yes!” I love talking to that guy.
  • I no longer ask people: What do you think about that? Instead, I ask: How did you come to believe that? That gets them talking about the people and experiences that shaped their values.
  • Storify whenever possible
  • People are much more revealing and personal when they are telling stories.
  • Do the looping, especially with adolescents
  • If you want to know how the people around you see the world, you have to ask them. Here are a few tips I’ve collected from experts on how to become a better conversationalist:
  • Turn your partner into a narrator
  • People don’t go into enough detail when they tell you a story. If you ask specific follow-up questions — Was your boss screaming or irritated when she said that to you? What was her tone of voice? — then they will revisit the moment in a more concrete way and tell a richer story
  • If somebody tells you he is having trouble with his teenager, don’t turn around and say: “I know exactly what you mean. I’m having incredible problems with my own Susan.” You may think you’re trying to build a shared connection, but what you are really doing is shifting attention back to yourself.
  • Don’t be a topper
  • Big questions.
  • The quality of your conversations will depend on the quality of your questions
  • As adults, we get more inhibited with our questions, if we even ask them at all. I’ve learned we’re generally too cautious. People are dying to tell you their stories. Very often, no one has ever asked about them.
  • So when I first meet people, I tend to ask them where they grew up. People are at their best when talking about their childhoods. Or I ask where they got their names. That gets them talking about their families and ethnic backgrounds.
  • After you’ve established trust with a person, it’s great to ask 30,000-foot questions, ones that lift people out of their daily vantage points and help them see themselves from above.
  • These are questions like: What crossroads are you at? Most people are in the middle of some life transition; this question encourages them to step back and describe theirs
  • I’ve learned it’s best to resist this temptation. My first job in any conversation across difference or inequality is to stand in other people’s standpoint and fully understand how the world looks to them. I’ve found it’s best to ask other people three separate times and in three different ways about what they have just said. “I want to understand as much as possible. What am I missing here?”
  • Can you be yourself where you are and still fit in? And: What would you do if you weren’t afraid? Or: If you died today, what would you regret not doing?
  • “What have you said yes to that you no longer really believe in?
  • “What is the no, or refusal, you keep postponing?”
  • “What is the gift you currently hold in exile?,” meaning, what talent are you not using
  • “Why you?” Why was it you who started that business? Why was it you who ran for school board? She wants to understand why a person felt the call of responsibility. She wants to understand motivation.
  • “How do your ancestors show up in your life?” But it led to a great conversation in which each of us talked about how we’d been formed by our family heritages and cultures. I’ve come to think of questioning as a moral practice. When you’re asking good questions, you’re adopting a posture of humility, and you’re honoring the other person.
  • Stand in their standpoint
  • I used to feel the temptation to get defensive, to say: “You don’t know everything I’m dealing with. You don’t know that I’m one of the good guys here.”
  • If the next five years is a chapter in your life, what is the chapter about?
  • every conversation takes place on two levels
  • The official conversation is represented by the words we are saying on whatever topic we are talking about. The actual conversations occur amid the ebb and flow of emotions that get transmitted as we talk. With every comment I am showing you respect or disrespect, making you feel a little safer or a little more threatened.
  • If we let fear and a sense of threat build our conversation, then very quickly our motivations will deteriorate
  • If, on the other hand, I show persistent curiosity about your viewpoint, I show respect. And as the authors of “Crucial Conversations” observe, in any conversation, respect is like air. When it’s present nobody notices it, and when it’s absent it’s all anybody can think about.
  • the novelist and philosopher Iris Murdoch argued that the essential moral skill is being considerate to others in the complex circumstances of everyday life. Morality is about how we interact with each other minute by minute.
  • I used to think the wise person was a lofty sage who doled out life-altering advice in the manner of Yoda or Dumbledore or Solomon. But now I think the wise person’s essential gift is tender receptivity.
  • The illuminators offer the privilege of witness. They take the anecdotes, rationalizations and episodes we tell and see us in a noble struggle. They see the way we’re navigating the dialectics of life — intimacy versus independence, control versus freedom — and understand that our current selves are just where we are right now on our long continuum of growth.
  • The really good confidants — the people we go to when we are troubled — are more like coaches than philosopher kings.
  • They take in your story, accept it, but prod you to clarify what it is you really want, or to name the baggage you left out of your clean tale.
  • They’re not here to fix you; they are here simply to help you edit your story so that it’s more honest and accurate. They’re here to call you by name, as beloved
  • They see who you are becoming before you do and provide you with a reputation you can then go live into.
  • there has been a comprehensive shift in my posture. I think I’m more approachable, vulnerable. I know more about human psychology than I used to. I have a long way to go, but I’m evidence that people can change, sometimes dramatically, even in middle and older age.

How Engaging With Art Affects the Human Brain | American Association for the Advancemen... - 0 views

  • Today, the neurological mechanisms underlying these responses are the subject of fascination to artists, curators and scientists alike.
  • "Once you circle these little things and come to the end of this little project, you'll be invited to compare where you came out against what the results of this experiment were," Vikan said. "What you'll find in this show is that there is an amazing convergence. The people that came to the museum liked and disliked the same categories of shapes as the people in the lab and the people in the fMRIs."
  • "Art accesses some of the most advanced processes of human intuitive analysis and expressivity and a key form of aesthetic appreciation is through embodied cognition, the ability to project oneself as an agent in the depicted scene,
  • Embodied cognition is "the sense of drawing you in and making you really feel the quality of the paintings,"
  • The Birth of Venus" because it makes them feel as though they are floating in with Venus on the seashell. Similarly, viewers can feel the flinging of the paint on the canvas when appreciating a drip painting by Jackson Pollock.
  • Mirror neurons, cells in the brain that respond similarly when observing and performing an action, are responsible for embodied cognition
  • Most research on the effects of music education has been done on populations that are privileged enough to afford private music instruction so Kraus is studying music instruction in group settings
  • "But observing the action requires the information to flow inward from the image you're seeing into the control centers. So that bidirectional flow is what's captured in this concept of mirror neurons and it gives the extra vividness to this aesthetics of art appreciation
  • Artists are known to be better observers and exhibit better memory than non-artists. In an effort to see what happens in the brain when an individual is drawing and whether drawing can increase the brain's plasticity
  • While congenitally blind people usually don't have activation in the visual area of the brain, in brain scans done after the subjects were taught to draw from memory, activation appeared in those visual areas.
  • Hearing speech in noise is one area in which musicians are uniquely skilled. In standardized tests, musicians across the lifespan were much better than the general public at listening to sentences and repeating them back as the level of background noise increased, Kraus said.
  • Performing an action requires the information to flow out from the control centers to the limbs,
  • Musicians are also known for their ability to keep rhythm, a skill that is correlated with reading ability and how precisely the brain responds to sound. After one year, students who participated in the group music instruction were faster and more accurate at keeping a beat than students in the control group, Kraus said.
  • "To sum things up, we are what we do and our past shapes our present," Kraus said. "Auditory biology is not frozen in time. It's a moving target. And music education really does seem to enhance communication by strengthening language skills."
  • "When you're doing art, your brain is running full speed,"
  • "It's hitting on all eight cylinders. So if you can figure out what's happening to the brain on art,

The "missing law" of nature was here all along | Salon.com - 0 views

  • recently published scientific article proposes a sweeping new law of nature, approaching the matter with dry, clinical efficiency that still reads like poetry.
  • “Evolving systems are asymmetrical with respect to time; they display temporal increases in diversity, distribution, and/or patterned behavior,” they continue, mounting their case from the shoulders of Charles Darwin, extending it toward all things living and not.
  • To join the known physics laws of thermodynamics, electromagnetism and Newton’s laws of motion and gravity, the nine scientists and philosophers behind the paper propose their “law of increasing functional information.”
  • In short, a complex and evolving system — whether that’s a flock of goldfinches or a nebula or the English language — will produce ever more diverse and intricately detailed states and configurations of itself.
  • Some of these more diverse and intricate configurations, the scientists write, are shed and forgotten over time. The configurations that persist are ones that find some utility or novel function in a process akin to natural selection, but a selection process driven by the passing-on of information rather than just the sowing of biological genes
  • Have they finally glimpsed, I wonder, the connectedness and symbiotic co-evolution of their own scientific ideas with those of the world’s writers
  • Have they learned to describe in their own quantifying language that cradle from which both our disciplines have emerged and the firmament on which they both stand — the hearing and telling of stories in order to exist?
  • Have they quantified the quality of all existent matter, living and not: that all things inherit a story in data to tell, and that our stories are told by the very forms we take to tell them?
  • “Is there a universal basis for selection? Is there a more quantitative formalism underlying this conjectured conceptual equivalence—a formalism rooted in the transfer of information?,” they ask of the world’s disparate phenomena. “The answer to both questions is yes.”
  • Yes. They’ve glimpsed it, whether they know it or not. Sing to me, O Muse, of functional information and its complex diversity.
  • The principle of complexity evolving at its own pace when left to its own devices, independent of time but certainly in a dance with it, is nothing new. Not in science, nor in its closest humanities kin, science and nature writing. Give things time and nourishing environs, protect them from your own intrusions and — living organisms or not — they will produce abundant enlacement of forms.
  • This is how poetry was born from the same larynxes and phalanges that tendered nuclear equations: We featherless bipeds gave language our time and delighted attendance until its forms were so multivariate that they overflowed with inevitable utility.
  • In her Pulitzer-winning “Pilgrim at Tinker Creek,” nature writer Annie Dillard explains plainly that evolution is the vehicle of such intricacy in the natural world, as much as it is in our own thoughts and actions.
  • “The stability of simple forms is the sturdy base from which more complex, stable forms might arise, forming in turn more complex forms,” she explains, drawing on the undercap frills of mushrooms and filament-fine filtering tubes inside human kidneys to illustrate her point.
  • “Utility to the creature is evolution’s only aesthetic consideration. Form follows function in the created world, so far as I know, and the creature that functions, however bizarre, survives to perpetuate its form,” writes Dillard.
  • “Of the multiplicity of forms, I know nothing. Except that, apparently, anything goes. This holds for forms of behavior as well as design — the mantis munching her mate, the frog wintering in mud.”
  • She notes that, of all forms of life we’ve ever known to exist, only about 10% are still alive. What extravagant multiplicity.
  • “Intricacy is that which is given from the beginning, the birthright, and in the intricacy is the hardiness of complexity that ensures against the failures of all life,” Dillard writes. “The wonder is — given the errant nature of freedom and the burgeoning texture of time — the wonder is that all the forms are not monsters, that there is beauty at all, grace gratuitous.”
  • “This paper, and the reason why I'm so proud of it, is because it really represents a connection between science and the philosophy of science that perhaps offers a new lens into why we see everything that we see in the universe,” lead scientist Michael Wong told Motherboard in a recent interview.
  • Wong is an astrobiologist and planetary scientist at the Carnegie Institute for Science. In his team’s paper, that bridge toward scientific philosophy is not only preceded by a long history of literary creativity but directly theorizes about the creative act itself.
  • “The creation of art and music may seem to have very little to do with the maintenance of society, but their origins may stem from the need to transmit information and create bonds among communities, and to this day, they enrich life in innumerable ways,” Wong’s team writes.
  • “Perhaps, like eddies swirling off of a primary flow field, selection pressures for ancillary functions can become so distant from the core functions of their host systems that they can effectively be treated as independently evolving systems,” the authors add, pointing toward the elaborate mating dance culture observed in birds of paradise.
  • “Perhaps it will be humanity’s ability to learn, invent, and adopt new collective modes of being that will lead to its long-term persistence as a planetary phenomenon. In light of these considerations, we suspect that the general principles of selection and function discussed here may also apply to the evolution of symbolic and social systems.”
  • The Mekhilta teaches that all Ten Commandments were pronounced in a single utterance. Similarly, the Maharsha says the Torah’s 613 mitzvoth are only perceived as a plurality because we’re time-bound humans, even though they together form a singular truth which is indivisible from He who expressed it.
  • Or, as the Mishna would have it, “the creations were all made in generic form, and they gradually expanded.”
  • Like swirling eddies off of a primary flow field.
  • “O Lord, how manifold are thy works!,” cried out David in his psalm. “In wisdom hast thou made them all: the earth is full of thy riches. So is this great and wide sea, wherein are things creeping innumerable, both small and great beasts.”
  • In all things, then — from poetic inventions, to rare biodiverse ecosystems, to the charted history of our interstellar equations — it is best if we conserve our world’s intellectual and physical diversity, for both the study and testimony of its immeasurable multiplicity.
  • Because, whether wittingly or not, science is singing the tune of the humanities. And whether expressed in algebraic logic or ancient Greek hymn, its chorus is the same throughout the universe: Be fruitful and multiply.
  • Both intricate configurations of art and matter arise and fade according to their shared characteristic, long-known by students of the humanities: each has been graced with enough time to attend to the necessary affairs of its most enduring pleasures.

Coronavirus Live Updates: China Is Tracking Travelers From Hubei - The New York Times - 0 views

  • To combat the spread of the coronavirus, Chinese officials are using a combination of technology and policing to track movements of citizens who may have visited Hubei Province.
  • Mobile phone owners in China get their service from one of three state-run telecommunications firms, which this week introduced a feature for subscribers to send text messages to a hotline that generates a list of provinces they have recently visited. That has created a new way for the authorities to see where citizens have traveled. At a high-speed rail station in the eastern city of Yiwu on Tuesday, officials in hazmat suits demanded that passengers send the text messages and then show their location information to the authorities before being permitted to leave the station. Those who had passed through Hubei were unlikely to be allowed entry.
  • Top officials in Beijing on Thursday expanded their mass roundup of sick or possibly infected people beyond Wuhan, the city at the center of the outbreak, to include other cities in Hubei Province that have been hit hard by the crisis, according to the state-run CCTV broadcaster.
  • Chinese officials reported Friday that a surge in new infections was continuing, though not as markedly as the day before, when the number of people confirmed to have the virus in Hubei Province skyrocketed by 14,840 cases.
  • Japan has confirmed its first death from the virus.
  • For a moment on Thursday, it seemed as if there might be some good news from the Diamond Princess, the cruise ship being held in the port of Yokohama in Japan, when the authorities said they would release some passengers to shore to finish their quarantine. Instead, Japanese health officials announced the first death from the virus in the country, of a woman in her 80s. It was the third death from the virus outside mainland China. The woman had no record of travel there.
  • The Centers for Disease Control said Thursday that a person under quarantine at a military base in San Antonio had tested positive for the virus, bringing the number of confirmed coronavirus patients in the United States to 15.
  • For the first time in a decade, global oil demand is expected to fall.
  • The travel industry in Asia has been upended.
  • Movie releases have been canceled in China and symphony tours suspended. A major art fair in Hong Kong was called off. And spring art auctions half a world away in New York have been postponed because well-heeled Chinese buyers may find it difficult to travel to them.
  • The U.S. reported its 15th case after a person under quarantine tested positive.
  • The arts world, too, is feeling the squeeze.
  • China ousted a provincial leader at the center of the outbreak.
  • China’s leader, Xi Jinping, on Thursday summarily fired two top Communist Party officials from Hubei Province, exacting political punishment for the regional government’s handling of the crisis.
  • A second citizen-journalist in Wuhan has disappeared.
  • A video blogger in the city of Wuhan who had been documenting conditions at overcrowded hospitals at the heart of the outbreak has disappeared, raising concerns among his supporters that he may have been detained by the authorities. The blogger, Fang Bin, is the second citizen journalist in the city to have gone missing in a week after criticizing the government’s response to the coronavirus epidemic. Mr. Fang began posting videos from hospitals in Wuhan on YouTube last month, including one that showed a pile of body bags in a minibus. In early February, Mr. Fang said he had been briefly detained and questioned. A few days later, he filmed an exchange he had with strangers who showed up at his apartment claiming to bring him food. Mr. Fang’s last video, posted on Sunday, was a message written on a piece of paper: “All citizens resist, hand power back to the people.” Last week, Chen Qiushi, a citizen-journalist and lawyer in Wuhan who recorded the plight of patients and the shortage of hospital supplies, vanished, according to his friends.
  • South Korea quarantined hundreds of soldiers who visited China.

The Epidemic of Facelessness - NYTimes.com - 1 views

  • The fact that the case ended up in court is rare; the viciousness it represents is not. Everyone in the digital space is, at one point or another, exposed to online monstrosity, one of the consequences of the uniquely contemporary condition of facelessness.
  • There is a vast dissonance between virtual communication and an actual police officer at the door. It is a dissonance we are all running up against more and more, the dissonance between the world of faces and the world without faces. And the world without faces is coming to dominate.
  • Inability to see a face is, in the most direct way, inability to recognize shared humanity with another. In a metastudy of antisocial populations, the inability to sense the emotions on other people’s faces was a key correlation. There is “a consistent, robust link between antisocial behavior and impaired recognition of fearful facial affect. Relative to comparison groups, antisocial populations showed significant impairments in recognizing fearful, sad and surprised expressions.”
  • the faceless communication social media creates, the linked distances between people, both provokes and mitigates the inherent capacity for monstrosity.
  • The Gyges effect, the well-noted disinhibition created by communications over the distances of the Internet, in which all speech and image are muted and at arm’s reach, produces an inevitable reaction — the desire for impact at any cost, the desire to reach through the screen, to make somebody feel something, anything. A simple comment can so easily be ignored. Rape threat? Not so much. Or, as Mr. Nunn so succinctly put it on Twitter: “If you can’t threaten to rape a celebrity, what is the point in having them?”
  • The challenge of our moment is that the face has been at the root of justice and ethics for 2,000 years.
  • The precondition of any trial, of any attempt to reconcile competing claims, is that the victim and the accused look each other in the face.
  • For the great French-Jewish philosopher Emmanuel Levinas, the encounter with another’s face was the origin of identity — the reality of the other preceding the formation of the self. The face is the substance, not just the reflection, of the infinity of another person. And from the infinity of the face comes the sense of inevitable obligation, the possibility of discourse, the origin of the ethical impulse.
  • “Through imitation and mimicry, we are able to feel what other people feel. By being able to feel what other people feel, we are also able to respond compassionately to other people’s emotional states.” The face is the key to the sense of intersubjectivity, linking mimicry and empathy through mirror neurons — the brain mechanism that creates imitation even in nonhuman primates.
  • it’s also no mere technical error on the part of Twitter; faceless rage is inherent to its technology.
  • Without a face, the self can form only with the rejection of all otherness, with a generalized, all-purpose contempt — a contempt that is so vacuous because it is so vague, and so ferocious because it is so vacuous. A world stripped of faces is a world stripped, not merely of ethics, but of the biological and cultural foundations of ethics.
  • The spirit of facelessness is coming to define the 21st century. Facelessness is not a trend; it is a social phase we are entering that we have not yet figured out how to navigate.
  • the flight back to the face takes on new urgency. Google recently reported that on Android alone, which has more than a billion active users, people take 93 million selfies a day
  • Emojis are an explicit attempt to replicate the emotional context that facial expression provides. Intriguingly, emojis express emotion, often negative emotions, but you cannot troll with them.
  • But all these attempts to provide a digital face run counter to the main current of our era’s essential facelessness. The volume of digital threats appears to be too large for police forces to adequately deal with.
  • The more established wisdom about trolls, at this point, is to disengage. Obviously, in many cases, actual crimes are being committed, crimes that demand confrontation, by victims and by law enforcement officials, but in everyday digital life engaging with the trolls “is like trying to drown a vampire with your own blood,”
  • There is a third way, distinct from confrontation or avoidance: compassion
  • we need a new art of conversation for the new conversations we are having — and the first rule of that art must be to remember that we are talking to human beings: “Never say anything online that you wouldn’t say to somebody’s face.” But also: “Don’t listen to what people wouldn’t say to your face.”
  • The neurological research demonstrates that empathy, far from being an artificial construct of civilization, is integral to our biology.

How Tech Can Turn Doctors Into Clerical Workers - The New York Times - 0 views

  • what I see in my colleague is disillusionment, and it has come too early, and I am seeing too much of it.
  • In America today, the patient in the hospital bed is just the icon, a place holder for the real patient who is not in the bed but in the computer. That virtual entity gets all our attention. Old-fashioned “bedside” rounds conducted by the attending physician too often take place nowhere near the bed but have become “card flip” rounds
  • My young colleague slumping in the chair in my office survived the student years, then three years of internship and residency and is now a full-time practitioner and teacher. The despair I hear comes from being the highest-paid clerical worker in the hospital: For every one hour we spend cumulatively with patients, studies have shown, we spend nearly two hours on our primitive Electronic Health Records, or “E.H.R.s,” and another hour or two during sacred personal time.
  • ...23 more annotations...
  • The living, breathing source of the data and images we juggle, meanwhile, is in the bed and left wondering: Where is everyone? What are they doing? Hello! It’s my body, you know
  • How we salivated at the idea of searchable records, of being able to graph fever trends, or white blood counts, or share records at a keystroke with another institution — “interoperability”
  • I can get cash and account details all over America and beyond. Yet I can’t reliably get a patient record from across town, let alone from a hospital in the same state, even if both places use the same brand of E.H.R
  • the leading E.H.R.s were never built with any understanding of the rituals of care or the user experience of physicians or nurses. A clinician will make roughly 4,000 keyboard clicks during a busy 10-hour emergency-room shift
  • In the process, our daily progress notes have become bloated cut-and-paste monsters that are inaccurate and hard to wade through. A half-page, handwritten progress note of the paper era might in a few lines tell you what a physician really thought
  • so much of the E.H.R., but particularly the physical exam it encodes, is a marvel of fiction, because we humans don’t want to leave a check box empty or leave gaps in a template.
  • For a study, my colleagues and I at Stanford solicited anecdotes from physicians nationwide about patients for whom an oversight in the exam (a “miss”) had resulted in real consequences, like diagnostic delay, radiation exposure, therapeutic or surgical misadventure, even death. They were the sorts of things that would leave no trace in the E.H.R. because the recorded exam always seems complete — and yet the omission would be glaring and memorable to other physicians involved in the subsequent care. We got more than 200 such anecdotes.
  • The reason for these errors? Most of them resulted from exams that simply weren’t done as claimed. “Food poisoning” was diagnosed because the strangulated hernia in the groin was overlooked, or patients were sent to the catheterization lab for chest pain because no one saw the shingles rash on the left chest.
  • I worry that such mistakes come because we’ve gotten trapped in the bunker of machine medicine. It is a preventable kind of failure
  • Our $3.4 trillion health care system is responsible for more than a quarter of a million deaths per year because of medical error, the rough equivalent of, say, a jumbo jet’s crashing every day.
  • Much of that is a result of poorly coordinated care, poor communication, patients falling through the cracks, knowledge not being transferred and so on, but some part of it is surely from failing to listen to the story and diminishing skill in reading the body as a text.
  • What if the computer gave the nurse the big picture of who he was both medically and as a person?
  • a professor at M.I.T. whose current interest in biomedical engineering is “bedside informatics,” marvels at the fact that in an I.C.U., a blizzard of monitors from disparate manufacturers display EKG, heart rate, respiratory rate, oxygen saturation, blood pressure, temperature and more, and yet none of this is pulled together, summarized and synthesized anywhere for the clinical staff to use
  • What these monitors do exceedingly well is sound alarms, an average of one alarm every eight minutes, or more than 180 per patient per day. What is our most common response to an alarm? We look for the button to silence the nuisance because, unlike those in a Boeing cockpit, say, our alarms are rarely diagnosing genuine danger.
  • By some estimates, more than 50 percent of physicians in the United States have at least one symptom of burnout, defined as a syndrome of emotional exhaustion, cynicism and decreased efficacy at work
  • It is on the increase, up by 9 percent from 2011 to 2014 in one national study. This is clearly not an individual problem but a systemic one, a 4,000-key-clicks-a-day problem.
  • The E.H.R. is only part of the issue: Other factors include rapid patient turnover, decreased autonomy, merging hospital systems, an aging population, the increasing medical complexity of patients. Even if the E.H.R. is not the sole cause of what ails us, believe me, it has become the symbol of burnout.
  • burnout is one of the largest predictors of physician attrition from the work force. The total cost of recruiting a physician can be nearly $90,000, but the lost revenue per physician who leaves is between $500,000 and $1 million, even more in high-paying specialties.
  • I hold out hope that artificial intelligence and machine-learning algorithms will transform our experience, particularly if natural-language processing and video technology allow us to capture what is actually said and done in the exam room.
  • as with any lab test, what A.I. will provide is at best a recommendation that a physician using clinical judgment must decide how to apply.
  • True clinical judgment is more than addressing the avalanche of blood work, imaging and lab tests; it is about using human skills to understand where the patient is in the trajectory of a life and the disease, what the nature of the patient’s family and social circumstances is and how much they want done.
  • The seriously ill patient has entered another kingdom, an alternate universe, a place and a process that is frightening, infantilizing; that patient’s greatest need is both scientific state-of-the-art knowledge and genuine caring from another human being. Caring is expressed in listening, in the time-honored ritual of the skilled bedside exam — reading the body — in touching and looking at where it hurts and ultimately in localizing the disease for patients not on a screen, not on an image, not on a biopsy report, but on their bodies.
  • As he was nearing death, Avedis Donabedian, a guru of health care metrics, was asked by an interviewer about the commercialization of health care. “The secret of quality,” he replied, “is love.”

The Science of Snobbery: How We're Duped Into Thinking Fancy Things Are Better - The At... - 0 views

  • Expert judges and amateurs alike claim to judge classical musicians based on sound. But Tsay’s research suggests that the original judges, despite their experience and expertise, judged the competition (which they heard and watched live) based on visual information, just as amateurs do.
  • just like with classical music, we do not appraise wine in the way that we expect.
  • Priceonomics revisited this seemingly damning research: the lack of correlation between wine enjoyment and price in blind tastings, the oenology students tricked by red food dye into describing a white wine like a red, a distribution of medals at tastings equivalent to what one would expect from pure chance, the grand crus described like cheap wines and vice-versa when the bottles are switched.
  • ...26 more annotations...
  • Taste does not simply equal your taste buds. It draws on information from all our senses as well as context. As a result, food is susceptible to the same trickery as wine. Adding yellow food dye to vanilla pudding leads people to experience a lemony taste. Diners eating in the dark at a chic concept restaurant confuse veal for tuna. Branding, packaging, and price tags are equally important to enjoyment. Cheap fish is routinely passed off as its pricier cousins at seafood and sushi restaurants.
  • Just like with wine and classical music, we often judge food based on very different criteria than what we claim. The result is that our perceptions are easily skewed in ways we don’t anticipate.
  • What does it mean for wine that presentation so easily trumps the quality imbued by being grown on premium Napa land or years of fruitful aging? Is it comforting that the same phenomenon is found in food and classical music, or is it a strike against the authenticity of our enjoyment of them as well? How common must these manipulations be until we concede that the influence of the price tag of a bottle of wine or the visual appearance of a pianist is not a trick but actually part of the quality?
  • To answer these questions, we need to investigate the underlying mechanism that leads us to judge wine, food, and music by criteria other than what we claim to value. And that mechanism seems to be the quick, intuitive judgments our minds unconsciously make
  • this unknowability also makes it easy to be led astray when our intuition makes a mistake. We may often be able to count on the price tag or packaging of food and wine for accurate information about quality. But as we believe that we’re judging based on just the product, we fail to recognize when presentation manipulates our snap judgments.
  • Participants were just as effective when watching 6-second video clips and when comparing their ratings to ratings of teacher effectiveness as measured by actual student test performance.
  • The power of intuitive first impressions has been demonstrated in a variety of other contexts. One experiment found that people predicted the outcome of political elections remarkably well based on silent 10-second video clips of debates - significantly outperforming political pundits and predictions made based on economic indicators.
  • In a real-world case, a number of art experts successfully identified a 6th century Greek statue as a fraud. Although the statue had survived a 14-month investigation by a respected museum that included the probings of a geologist, they instantly recognized something was off. They just couldn’t explain how they knew.
  • Cases like this represent the canon behind the idea of the “adaptive unconscious,” a concept made famous by journalist Malcolm Gladwell in his book Blink. The basic idea is that we constantly, quickly, and unconsciously do the equivalent of judging a book by its cover. After all, a cover provides a lot of relevant information in a world in which we don’t have time to read every page.
  • Gladwell describes the adaptive unconscious as “a kind of giant computer that quickly and quietly processes a lot of the data we need in order to keep functioning as human beings.”
  • In a famous experiment, psychologist Nalini Ambady provided participants in an academic study with 30 second silent video clips of a college professor teaching a class and asked them to rate the effectiveness of the professor.
  • In follow up experiments, Chia-Jung Tsay found that those judging musicians’ auditions based on visual cues were not giving preference to attractive performers. Rather, they seemed to look for visual signs of relevant characteristics like passion, creativity, and uniqueness. Seeing signs of passion is valuable information. But in differentiating between elite performers, it gives an edge to someone who looks passionate over someone whose play is passionate
  • Outside of these more eccentric examples, it’s our reliance on quick judgments, and ignorance of their workings, that cause people to act on ugly, unconscious biases
  • It’s also why - from a business perspective - packaging and presentation is just as important as the good or service on offer. Why marketing is just as important as product.
  • Gladwell ends Blink optimistically. By paying closer attention to our powers of rapid cognition, he argues, we can avoid its pitfalls and harness its powers. We can blindly audition musicians behind a screen, look at a piece of art devoid of other context, and pay particular attention to possible unconscious bias in our performance reports.
  • But Gladwell’s success in demonstrating the many calculations our adaptive unconscious performs without our awareness undermines his hopeful message of consciously harnessing its power.
  • As a former world-class tennis player and coach of over 50 years, Braden is a perfect example of the ideas behind thin slicing. But if he can’t figure out what his unconscious is up to when he recognizes double faults, why should anyone else expect to be up to the task?
  • flawed judgment in fields like medicine and investing has more serious consequences. The fact that expertise is so tricky leads psychologist Daniel Kahneman to assert that most experts should seek the assistance of statistics and algorithms in making decisions.
  • In his book Thinking, Fast and Slow, he describes our two modes of thought: System 1, like the adaptive unconscious, is our “fast, instinctive, and emotional” intuition. System 2 is our “slower, more deliberative, and more logical” conscious thought. Kahneman believes that we often leave decisions up to System 1 and generally place far “too much confidence in human judgment” due to the pitfalls of our intuition described above.
  • Not every judgment will be made in a field that is stable and regular enough for an algorithm to help us make judgments or predictions. But in those cases, he notes, “Hundreds of studies have shown that wherever we have sufficient information to build a model, it will perform better than most people.”
  • Experts can avoid the pitfalls of intuition more easily than laypeople. But they need help too, especially as our collective confidence in expertise leads us to overconfidence in their judgments.
  • This article has referred to the influence of price tags and context on products and experiences like wine and classical music concerts as tricks that skew our perception. But maybe we should consider them a real, actual part of the quality.
  • Losing ourselves in a universe of relativism, however, will lead us to miss out on anything new or unique. Take the example of the song “Hey Ya!” by Outkast. When the music industry heard it, they felt sure it would be a hit. When it premiered on the radio, however, listeners changed the channel. The song sounded too dissimilar from songs people liked, so they responded negatively.
  • It took time for people to get familiar with the song and realize that they enjoyed it. Eventually “Hey Ya!” became the hit of the summer.
  • Many boorish people talking about the ethereal qualities of great wine probably can't even identify cork taint because their impressions are dominated by the price tag and the wine label. But the classic defense of wine - that you need to study it to appreciate it - is also vindicated. The open question - which is both editorial and empiric - is what it means for the industry that constant vigilance and substantial study is needed to dependably appreciate wine for the product quality alone. But the question is relevant to the enjoyment of many other products and experiences in life.
  • Maybe the most important conclusion is to not only recognize the fallibility of our judgments and impressions, but to recognize when it matters, and when it doesn’t

How Memory Works: Interview with Psychologist Daniel L. Schacter | History News Network - 2 views

  • knowledge from a scientific perspective of how human memory works can be instructive to historians.
  • Memory is much more than a simple retrieval system, as Dr. Schacter has demonstrated in his research. Rather, the nature of memory is constructive and influenced by a person’s current state as well as intervening emotions, beliefs, events and other factors since a recalled event.
  • Dr. Schacter is William R. Kenan, Jr. Professor of Psychology at Harvard University. His books include Searching for Memory: The Brain, The Mind, and The Past, and The Seven Sins of Memory: How the Mind Forgets and Remembers, both winners of the American Psychological Association’s William James Book Award, and Forgotten Ideas, Neglected Pioneers: Richard Semon and the Story of Memory. He also has written hundreds of articles on memory and related matters. He was elected a Fellow of the American Academy of Arts and Sciences in 1996 and the National Academy of Sciences in 2013.
  • ...16 more annotations...
  • that memory is not a video recorder [but that] it’s a constructive activity that is in many ways accurate but prone to interesting errors and distortions. It’s the constructive side of memory that is most relevant to historians.
  • Is it the case then that our memories constantly change every time we access them?
  • That certainly can happen depending on how you recount a memory. What you emphasize. What you exaggerate. What you don’t talk about. All of those things will shape and sculpt the memory for future use. Certainly the potential is there.
  • Research on memory shows that the more distant in time the event, the more prone to inaccuracy the memory. There are several experiments in which subjects recorded impressions of an event soon afterward, then a year later and then a few years later, and the memory changed. Yes. It’s not that the information is lost but, as the memory weakens, you become more prone to incorporating other kinds of information or mixing up elements of other events. This has been seen, for example, in the study of flashbulb memories. Where were you when Kennedy was shot? Where were you when you heard about 9/11?
  • Isn’t there a tendency to add details or information that may make the story more convincing or interesting later? Yes. That’s more a social function of memory. It may be that you draw on your general knowledge and probable information from your memory in a social context where there may be social demands that lead you to distort the memory.
  • What are the different memory systems?
  • What is the difference between working memory and permanent memory? Working memory is really a temporary memory buffer where you hold onto information, manipulate information, use it, and it’s partly a gateway to long-term memory and also a buffer that you use when you’re retrieving information from long-term memory and that information temporarily resides in working memory, so to speak.
  • Your discussion of the testimony of White House Counsel John Dean about Watergate is illuminating. There was a perception that Dean had a photographic memory and he testified in rich detail about events. Yet later studies of White House tape recordings revealed that he was often inaccurate.
  • He was perceived because of all the detail with which he reported events and the great confidence to be something analogous to a human tape recorder. Yet there was interesting work done by psychologist Ulric Neisser who went back and analyzed what Dean said at the hearings as compared to available information on the White House taping system and basically found many and significant discrepancies between what Dean remembered and what was actually said. He usually had the gist and the meaning and overall significance right, but the exact details were often quite different in his memory than what actually was said.
  • That seems to get into the area of false memories and how they present problems in the legal system. We know from DNA exonerations of people wrongfully convicted of crimes that a large majority of those cases -- one of the more recent estimates is that of the first 250 DNA exoneration cases as of 2011, roughly 70 to 75 percent of those individuals were convicted on the basis of faulty eyewitness memory.
  • One of the interesting recent lines of research that my lab has been involved in over the past few years has been looking at similarities between what goes on between the brain and mind when we remember past events on the one hand and imagine events that might occur in the future or might have occurred in the past. What we have found, particularly with brain scanning studies, is that you get very similar brain networks coming online when you remember past events and imagine future events, for example. Many of the same brain regions or network of structures come online, and this has helped us understand more why, for example, imagining events that might have occurred can be so harmful to memory accuracy because when you imagine, you’re recruiting many of the same brain regions as accessed when you actually remember. So it’s not surprising that some of these imagined events can actually turn into false memories under the right circumstances.
  • One reasonably well accepted distinction involves episodic memory, the memory for personal experience; semantic memory, the memory for general knowledge; and procedural memory, the memory for skills and unconscious forms of memory.Those are three of the major kinds of memory and they all have different neural substrates.
  • One of the points from that Ross Perot study is that his supporters often misremembered what they felt like at the time he reported he had dropped out of the race. The nature of that misremembering depended on their state at the time they were remembering, and what decisions they had made about Perot in the interim affected how they reconstructed their earlier memories. Again, that nicely makes the point that our current emotions and current appraisals of a situation can feed back into our reconstruction of the past and sometimes lead us to distort our memories so that they better support our current emotions and our current selves. We’re often using memories to justify what we currently know, believe and feel.
  • memory doesn’t work like a video camera or tape recorder. That is the main point. Our latest thinking on this is the idea that one of the major functions of memory is to support our ability to plan for the future, to imagine the future, and to use our past experiences in a flexible way to simulate different outcomes of events.
  • flexibility of memory is something that makes it useful to support this very important ability to run simulations of future events. But that very flexibility might be something that contributes to some of the memory distortion we talked about. That has been prominent in the last few years in my thinking about the constructive nature of memory.
  • The historian Daniel Aaron told his students “we remember what’s important.” What do you think of that comment? I think that generally holds true. Certainly, again, more important memories tend to be more significant with more emotional arousal and may elicit “deeper processing”, as we call it in cognitive psychology

In Fiery Protest, Italian Museum Sets Art Ablaze : NPR - 0 views

  • Manfredi's "art war" consists of setting works of art on fire to protest cuts to Italy's arts budget. He's pledged to incinerate two or three pieces of art each week from a museum collection housing about 1,000 exhibits.
  • The budgets of state-run museums, archaeological sites and libraries are among the hardest hit.
  • not just about funding, but also an appeal for moral help and attention from authorities.
  • ...3 more annotations...
  • "We want the institutions in Italy and around the world to understand that the culture is very important," he says. "And it's not possible when there is an economic problem in the world, [that] the first that the government destroys is the art."
  • Italian government spending on the arts has been slashed by some 76 percent over the past two years.
  • during the recession — when people don't have money to buy gasoline — the number of visitors to museums and archaeological sites is actually growing. Resca looks to the Greek philosopher Aristotle to explain the phenomenon. "He said that during successful period[s], culture was an ornament," Resca says. "In bad periods, culture is a big shelter.